Mind Change Optimal Learning of Bayes Net Structure
Authors
Abstract
This paper analyzes the problem of learning the structure of a Bayes net (BN) in the theoretical framework of Gold’s learning paradigm. Bayes nets are one of the most prominent formalisms for knowledge representation and probabilistic and causal reasoning. We follow constraint-based approaches to learning Bayes net structure, where learning is based on observed conditional dependencies between variables of interest (e.g., “X is dependent on Y given any assignment to variable Z”). Applying learning criteria in this model leads to the following results. (1) The mind change complexity of identifying a Bayes net graph over variables V from dependency data is $\binom{|V|}{2}$, the maximum number of edges. (2) There is a unique fastest mind-change optimal Bayes net learner, where convergence speed is evaluated using Gold’s dominance notion of “uniformly faster convergence”. This learner conjectures a graph if it is the unique Bayes net pattern that satisfies the observed dependencies with a minimum number of edges, and outputs “no guess” otherwise. Thus standard learning criteria define a natural and novel Bayes net learning algorithm. We investigate the complexity of computing the output of the fastest mind-change optimal learner, and show that this problem is NP-hard (assuming P = RP). To our knowledge this is the first NP-hardness result concerning the existence of a uniquely optimal Bayes net structure.
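The learner in result (2) admits a direct brute-force rendering for very small variable sets. The sketch below is an illustration only, not the authors' implementation; all names (fastest_learner, d_separated, pattern, is_acyclic) are our own. It enumerates DAGs over V by increasing edge count, keeps those in which every observed dependency (x dependent on y given z) corresponds to x and y not being d-separated by z, and outputs the Markov-equivalence pattern (skeleton plus v-structures) if the minimum-edge consistent DAGs form a single pattern, and “no guess” otherwise. The paper's NP-hardness result indicates that no such enumeration scales beyond toy instances.

```python
# Illustrative sketch, not the authors' code: a brute-force version of the
# fastest mind-change optimal learner described above, feasible only for tiny |V|.
# All function names are hypothetical.
from itertools import combinations

def ancestors(parents, nodes):
    """Return `nodes` together with all of their ancestors in the DAG."""
    result, stack = set(nodes), list(nodes)
    while stack:
        for p in parents[stack.pop()]:
            if p not in result:
                result.add(p)
                stack.append(p)
    return result

def d_separated(edges, variables, x, y, z):
    """True iff x and y are d-separated by z in the DAG given as (parent, child)
    edges.  Uses the moral graph of the ancestral subgraph of {x, y} union z."""
    parents = {v: {a for (a, b) in edges if b == v} for v in variables}
    anc = ancestors(parents, {x, y} | set(z))
    nbrs = {v: set() for v in anc}
    for (a, b) in edges:                         # keep edges inside anc, drop directions
        if a in anc and b in anc:
            nbrs[a].add(b); nbrs[b].add(a)
    for v in anc:                                # "marry" parents of a common child
        for p, q in combinations(parents[v] & anc, 2):
            nbrs[p].add(q); nbrs[q].add(p)
    stack, seen = [x], {x}                       # can x reach y while avoiding z?
    while stack:
        n = stack.pop()
        if n == y:
            return False                         # connected => not d-separated
        for m in nbrs[n]:
            if m not in seen and m not in z:
                seen.add(m); stack.append(m)
    return True

def is_acyclic(edges, variables):
    """Kahn-style topological check that the directed edge set has no cycle."""
    indeg = {v: sum(1 for (_, b) in edges if b == v) for v in variables}
    queue, removed = [v for v in variables if indeg[v] == 0], 0
    while queue:
        v = queue.pop(); removed += 1
        for (a, b) in edges:
            if a == v:
                indeg[b] -= 1
                if indeg[b] == 0:
                    queue.append(b)
    return removed == len(variables)

def pattern(edges):
    """Canonical key of the Markov equivalence class: skeleton + v-structures."""
    skeleton = frozenset(frozenset(e) for e in edges)
    vstructs = frozenset((frozenset({a, b}), c) for (a, c) in edges for (b, d) in edges
                         if c == d and a != b and frozenset({a, b}) not in skeleton)
    return skeleton, vstructs

def fastest_learner(variables, dependencies):
    """Each dependency (x, y, z) reads 'x is dependent on y given the set z'.
    Return the unique minimum-edge pattern satisfying them all, else 'no guess'."""
    possible = [(a, b) for a in variables for b in variables if a != b]
    for k in range(len(possible) + 1):           # smallest k first = minimum edge count
        found = [edges for edges in combinations(possible, k)
                 if is_acyclic(edges, variables)
                 and all(not d_separated(edges, variables, x, y, z)
                         for (x, y, z) in dependencies)]
        if found:
            patterns = {pattern(e) for e in found}
            return patterns.pop() if len(patterns) == 1 else "no guess"
    return "no guess"

if __name__ == "__main__":
    V = ["X", "Y", "Z"]
    # Dependencies entailed by the collider X -> Y <- Z (and by no smaller DAG).
    deps = [("X", "Y", ()), ("X", "Y", ("Z",)), ("Y", "Z", ()),
            ("Y", "Z", ("X",)), ("X", "Z", ("Y",))]
    result = fastest_learner(V, deps)
    if result == "no guess":
        print(result)
    else:
        skeleton, vstructs = result
        print("skeleton:    ", sorted(tuple(sorted(e)) for e in skeleton))
        print("v-structures:", sorted((tuple(sorted(ab)), c) for ab, c in vstructs))
```

On the example in the `__main__` block, the unique minimum-edge pattern consistent with the dependencies is the collider X → Y ← Z, i.e. skeleton {X–Y, Y–Z} with the single v-structure (X, Z) → Y; dropping some of the dependencies typically leaves several minimum-edge patterns, in which case the learner outputs “no guess”.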
Similar Articles
Mind change optimal learning of Bayes net structure from dependency and independency data
Article history: Received 8 July 2008; Revised 12 March 2009; Available online 3 May 2009.
This paper analyzes the problem of learning the structure of a Bayes net in the theoretical framework of Gold’s learning paradigm. Bayes nets are one of the most prominent formalisms for knowledge representation and probabilistic and causal reasoning. We follow constraint-based approaches to learning Bayes n...
Join Bayes Nets: A New Type of Bayes net for Relational Data
Many real-world data are maintained in relational format, with different tables storing information about entities and their links or relationships. The structure (schema) of the database is essentially that of a logical language, with variables ranging over individual entities and predicates for relationships and attributes. Our work combines the graphical structure of Bayes nets with the logi...
Mind Change Efficient Learning
This paper studies efficient learning with respect to mind changes. Our starting point is the idea that a learner that is efficient with respect to mind changes minimizes mind changes not only globally in the entire learning problem, but also locally in subproblems after receiving some evidence. Formalizing this idea leads to the notion of uniform mind change optimality. We characterize the str...
Structured Priors for Structure Learning
Traditional approaches to Bayes net structure learning typically assume little regularity in graph structure other than sparseness. However, in many cases, we expect more systematicity: variables in real-world systems often group into classes that predict the kinds of probabilistic dependencies they participate in. Here we capture this form of prior knowledge in a hierarchical Bayesian framewor...